
    Normalisation Control in Deep Inference via Atomic Flows

    We introduce 'atomic flows': they are graphs obtained from derivations by tracing atom occurrences and forgetting the logical structure. We study simple manipulations of atomic flows that correspond to complex reductions on derivations. This allows us to prove, for propositional logic, a new and very general normalisation theorem, which contains cut elimination as a special case. We operate in deep inference, which is more general than other syntactic paradigms, and where normalisation is more difficult to control. We argue that atomic flows are a significant technical advance for normalisation theory, because 1) the technique they support is largely independent of syntax; 2) indeed, it is largely independent of logical inference rules; 3) they constitute a powerful geometric formalism, which is more intuitive than syntax.
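    To fix intuitions, here is a small Haskell sketch; it is ours, not the paper's, and it assumes the standard presentation in which an atomic flow has one vertex per atomic structural rule instance and one edge per traced atom occurrence (the names NodeKind, AtomicFlow, wellShaped are invented for this illustration):

        type EdgeId = Int   -- one edge per traced atom occurrence

        -- Vertex kinds, one per atomic structural rule (standard presentation).
        data NodeKind
          = Identity        -- 0 edges in, 2 out
          | Cut             -- 2 in, 0 out
          | Weakening       -- 0 in, 1 out
          | Coweakening     -- 1 in, 0 out
          | Contraction     -- 2 in, 1 out
          | Cocontraction   -- 1 in, 2 out
          deriving (Show, Eq)

        data Node = Node { kind :: NodeKind, ins :: [EdgeId], outs :: [EdgeId] }
          deriving Show

        -- An atomic flow: vertices plus edges; the logical connectives are forgotten.
        data AtomicFlow = AtomicFlow { vertices :: [Node], edgeIds :: [EdgeId] }
          deriving Show

        -- The arity discipline listed in the comments above.
        wellShaped :: Node -> Bool
        wellShaped (Node k i o) = (length i, length o) == arity k
          where
            arity Identity      = (0, 2)
            arity Cut           = (2, 0)
            arity Weakening     = (0, 1)
            arity Coweakening   = (1, 0)
            arity Contraction   = (2, 1)
            arity Cocontraction = (1, 2)

    On objects of this kind, the manipulations mentioned in the abstract become local graph rewrites, with no reference to the logical structure that has been forgotten.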

    A System of Interaction and Structure

    This paper introduces a logical system, called BV, which extends multiplicative linear logic by a non-commutative self-dual logical operator. This extension is particularly challenging for the sequent calculus, where it has so far not been achieved. It becomes very natural in a new formalism, called the calculus of structures, which is the main contribution of this work. Structures are formulae subject to certain equational laws typical of sequents. The calculus of structures is obtained by generalising the sequent calculus in such a way that a new top-down symmetry of derivations is observed, and it employs inference rules that rewrite inside structures at any depth. These properties, in addition to allowing the design of BV, yield a modular proof of cut elimination.
    Comment: This is the authoritative version of the article, with readable pictures, in colour, also available at . (The published version contains errors introduced by the editorial processing.) Web site for Deep Inference and the Calculus of Structures at <http://alessio.guglielmi.name/res/cos>
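    To illustrate what 'inference rules that rewrite inside structures at any depth' amounts to, here is a minimal Haskell sketch; it is ours rather than the paper's definition, the constructor names are invented, and the equational laws on structures are ignored:

        -- Illustrative only: BV-like structures, with negation pushed onto atoms.
        data Structure
          = One                          -- the unit
          | Atom String Bool             -- atom name and polarity
          | Par   Structure Structure    -- commutative context, written [R,T] in the paper
          | Copar Structure Structure    -- commutative context, written (R,T)
          | Seq   Structure Structure    -- non-commutative self-dual context, written <R;T>
          deriving (Show, Eq)

        -- A rule instance: a partial rewrite from premiss shape to conclusion shape.
        type Rule = Structure -> Maybe Structure

        -- Deep application: try the rule here, or anywhere strictly inside.
        deep :: Rule -> Structure -> [Structure]
        deep r s = maybe [] (: []) (r s) ++ inside s
          where
            inside (Par   a b) = [Par   a' b | a' <- deep r a] ++ [Par   a b' | b' <- deep r b]
            inside (Copar a b) = [Copar a' b | a' <- deep r a] ++ [Copar a b' | b' <- deep r b]
            inside (Seq   a b) = [Seq   a' b | a' <- deep r a] ++ [Seq   a b' | b' <- deep r b]
            inside _           = []

    A rule such as atomic interaction, which rewrites the unit into [a, ā], can then be fired by deep at any position of a structure rather than only at the top level; this extra freedom is what distinguishes the calculus of structures from the sequent calculus.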

    On Structuring Proof Search for First Order Linear Logic

    Full first order linear logic can be presented as an abstract logic programming language in Miller's system Forum, which yields a sensible operational interpretation in the 'proof search as computation' paradigm. However, Forum still has to deal with syntactic details that would normally be ignored by a reasonable operational semantics. In this respect, Forum improves on Gentzen systems for linear logic by restricting the language and the form of inference rules. We further improve on Forum by restricting the class of formulae allowed, in a system we call G-Forum, which is still equivalent to full first order linear logic. The only formulae allowed in G-Forum have the same shape as Forum sequents: the restriction does not diminish expressiveness and makes G-Forum amenable to proof theoretic analysis. G-Forum consists of two (big) inference rules, for which we show a cut elimination procedure. This does not need to appeal to finer detail in formulae and sequents than is provided by G-Forum, thus successfully testing the internal symmetries of our system.
    Comment: Author website at http://alessio.guglielmi.name/res

    A Linear Logic View of Gamma style Computations as proof searches

    A System of Interaction and Structure IV: The Exponentials and Decomposition

    We study a system, called NEL, which is the mixed commutative/non-commutative linear logic BV augmented with linear logic's exponentials. Equivalently, NEL is MELL augmented with the non-commutative self-dual connective seq. In this paper, we show a basic compositionality property of NEL, which we call decomposition. This result leads to a cut-elimination theorem, which is proved in the next paper of this series. To control the induction measure for the theorem, we rely on a novel technique that extracts the structure of the exponentials from NEL proofs into what we call !-?-Flow-Graphs.
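    Concretely (our gloss, in the usual calculus-of-structures notation and up to the equational laws on structures), NEL structures extend the grammar of BV with the exponentials of MELL:

        \[
          R \;::=\; \circ \;\mid\; a \;\mid\; [R,R] \;\mid\; (R,R)
            \;\mid\; \langle R;R \rangle \;\mid\; {!}R \;\mid\; {?}R
        \]

    where ◦ is the unit, [R,R] and (R,R) are the commutative par and times contexts, and ⟨R;R⟩ is the non-commutative self-dual seq.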

    Deep Inference

    Deep inference could succinctly be described as an extreme form of linear logic [11]. It is a methodology for designing proof formalisms that generalise Gentzen formalisms, i.e. the sequent calculus and natural deduction [10]. In a sense, deep inference is obtained by applying some of the main concepts behind linear logic
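    For a flavour of what 'deep' means, here are two typical rules of the propositional system SKS of the calculus of structures (a standard example from the deep-inference literature, not taken from the text above); each carries an arbitrary context S{ } around its redex and may therefore be applied at any depth inside a formula:

        \[
          \mathsf{ai}{\downarrow}\,\frac{S\{\mathrm{t}\}}{S\{a \vee \bar{a}\}}
          \qquad\qquad
          \mathsf{s}\,\frac{S\{(A \vee B) \wedge C\}}{S\{(A \wedge C) \vee B\}}
        \]

    In a Gentzen sequent calculus the corresponding steps can only act at the top level of a sequent; removing that restriction is the generalisation referred to above.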

    The Impact of Learning Strategies and Future Orientation on Academic Success: The Moderating Role of Academic Self-Efficacy among Italian Undergraduate Students

    Promoting academic success among undergraduate students is crucial to fostering employability competencies. Low levels of academic attainment in higher education, together with the increasing number of people entering tertiary education, are crucial trends that need to be studied in order to develop effective retention practices. The current study investigated the relationship between relevant factors that can foster academic success: learning strategies, future orientation, and academic self-efficacy. To this purpose, a longitudinal study was performed on a sample of N = 87 undergraduate students from one of the largest Italian universities (63.4% males, 74.2% enrolled in the first year). Participants filled in an online questionnaire at two time points, twelve months apart. Results of a moderated mediation model indicated that the relationship between learning strategies at Time 1 (T1) and Grade Point Average (GPA) at Time 2 (T2) was mediated by students' future orientation. Moreover, this association was moderated by T1 academic self-efficacy. These results suggest that learning strategies positively influence GPA through an enhanced future orientation, in particular when students report high or medium levels of self-efficacy. The current findings invite a thorough review of training interventions for improving academic achievement.

    On the proof complexity of deep inference

    We obtain two results about the proof complexity of deep inference: (1) Deep-inference proof systems are as powerful as Frege ones, even when both are extended with the Tseitin extension rule or with the substitution rule; (2) there are analytic deep-inference proof systems that exhibit an exponential speedup over analytic Gentzen proof systems that they polynomially simulate.
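    One way to restate the two results symbolically (our notation: ≡_p and ≤_p denote mutual and one-way polynomial simulation; SKS and its analytic fragment KS are the usual calculus-of-structures names, which the abstract itself does not use):

        \[
          \mathsf{SKS} \equiv_p \mathsf{Frege}, \qquad
          \mathsf{SKS}{+}\mathrm{extension} \equiv_p \mathsf{Frege}{+}\mathrm{extension}, \qquad
          \mathsf{SKS}{+}\mathrm{substitution} \equiv_p \mathsf{Frege}{+}\mathrm{substitution},
        \]
        \[
          \text{cut-free Gentzen} \;\le_p\; \mathsf{KS}.
        \]

    The speedup in (2) then says that some family of tautologies admits polynomial-size proofs in the analytic deep-inference system but only exponential-size proofs in the analytic Gentzen systems it simulates.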

    Subatomic Proof Systems: Splittable Systems

    This paper presents the first in a series of results that allow us to develop a theory providing finer control over the complexity of normalisation, and in particular of cut elimination. By considering atoms as self-dual non-commutative connectives, we are able to classify a vast class of inference rules in a uniform and very simple way. This allows us to define simple conditions that are easily verifiable and that ensure normalisation and cut elimination by way of a general theorem. In this paper we define and consider splittable systems, which essentially comprise a large class of linear logics, including MLL and BV, and we prove for them a splitting theorem, guaranteeing cut elimination and other admissibility results as corollaries. In papers to follow, we will extend this result to non-linear logics. The final outcome will be a comprehensive theory giving a uniform treatment for most existing logics and providing a blueprint for the design of future proof systems.
    Comment: 32 pages
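    As an informal illustration of 'atoms as self-dual non-commutative connectives' (our paraphrase; the paper states the general rule shape and its side conditions precisely), reading an atom a as a binary connective lets familiar rules arise as instances of one medial-like scheme over a pair of connectives α and β (notation ours):

        \[
          \frac{(A \mathbin{\alpha} B) \mathbin{\beta} (C \mathbin{\alpha} D)}
               {(A \mathbin{\beta} C) \mathbin{\alpha} (B \mathbin{\beta} D)}
        \]

    Taking α to be conjunction and β disjunction yields the usual medial rule; taking α to be the atom a itself, written subatomically in terms of the units, yields atomic contraction. Which connectives may soundly occupy the α and β positions is governed by conditions of the kind the abstract mentions.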